Convergence rates of Kernel Conjugate Gradient for random design regression


Similar articles

Kernel Conjugate Gradient for Fast Kernel Machines

We propose a novel variant of the conjugate gradient algorithm, Kernel Conjugate Gradient (KCG), designed to speed up learning for kernel machines with differentiable loss functions. This approach leads to a better conditioned optimization problem during learning. We establish an upper bound on the number of iterations for KCG that indicates it should require less than the square root of the nu...
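
The truncated abstract describes conjugate gradient carried out in the RKHS geometry rather than the Euclidean one. As a hedged sketch (not the authors' implementation; the regularized least-squares system, the kernel matrix K, and every name below are illustrative assumptions), KCG amounts to ordinary CG with each inner product between coefficient vectors taken through K:

```python
import numpy as np

def kernel_cg(K, y, lam=1e-3, tol=1e-10, max_iter=100):
    """CG for (K + lam*I) @ alpha = y, with all inner products taken
    in the RKHS sense <u, v> = u @ K @ v. Sketch, not the paper's KCG."""
    n = K.shape[0]
    A = K + lam * np.eye(n)
    alpha = np.zeros(n)
    r = y - A @ alpha                # residual in coefficient space
    d = r.copy()                     # first search direction
    rKr = r @ K @ r                  # squared RKHS norm of the residual
    for _ in range(max_iter):
        Ad = A @ d
        step = rKr / (d @ K @ Ad)    # exact line search in the K-geometry
        alpha += step * d
        r -= step * Ad
        rKr_new = r @ K @ r
        if rKr_new < tol:
            break
        d = r + (rKr_new / rKr) * d  # conjugate direction update
        rKr = rKr_new
    return alpha
```

Because K + lam*I commutes with K, the system matrix stays self-adjoint under the K-inner product, so the usual CG recurrences remain valid; capping max_iter (early stopping) is then a natural regularization mechanism of the kind analyzed in the headline paper.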


Kernel Conjugate Gradient

We propose a novel variant of conjugate gradient based on the Reproducing Kernel Hilbert Space (RKHS) inner product. An analysis of the algorithm suggests it enjoys better performance properties than standard iterative methods when applied to learning kernel machines. Experimental results for both classification and regression bear out the theoretical implications. We further address the domina...
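
For concreteness, the RKHS inner product the abstract builds on reduces, via the reproducing property, to a kernel-matrix quadratic form; this identity is the entire difference between KCG and CG run on plain coefficient vectors:

```latex
% For f = \sum_i \alpha_i k(x_i, \cdot) and g = \sum_j \beta_j k(x_j, \cdot),
% with kernel matrix K_{ij} = k(x_i, x_j), the reproducing property gives
\langle f, g \rangle_{\mathcal{H}}
  = \sum_{i,j} \alpha_i \beta_j \, k(x_i, x_j)
  = \alpha^{\top} K \beta,
\qquad
\| f \|_{\mathcal{H}}^{2} = \alpha^{\top} K \alpha .
```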


A unified convergence bound for conjugate gradient and accelerated gradient

Nesterov’s accelerated gradient method for minimizing a smooth strongly convex function f is known to reduce f(x_k) − f(x*) by a factor of ε ∈ (0, 1) after k ≥ O(√(L/ℓ) log(1/ε)) iterations, where ℓ, L are the two parameters of smooth strong convexity. Furthermore, it is known that this is the best possible complexity in the function-gradient oracle model of computation. The method of linear conju...
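
To see the claimed rate concretely, here is a hedged side-by-side sketch (the random quadratic, seed, and tolerance are invented for illustration, and neither routine is taken from the paper): on a strongly convex quadratic both methods need on the order of √(L/ℓ) log(1/ε) iterations, with linear CG at least as fast in exact arithmetic.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
Q = rng.standard_normal((n, n))
A = Q.T @ Q + np.eye(n)                    # SPD Hessian: f(x) = 0.5 x'Ax - b'x
b = rng.standard_normal(n)
ell, L = np.linalg.eigvalsh(A)[[0, -1]]    # strong convexity and smoothness

def nesterov(A, b, L, ell, tol=1e-8, max_iter=10_000):
    """Constant-momentum accelerated gradient for strongly convex quadratics."""
    beta = (np.sqrt(L / ell) - 1) / (np.sqrt(L / ell) + 1)
    x = y = np.zeros(len(b))
    for k in range(1, max_iter + 1):
        x_new = y - (A @ y - b) / L        # gradient step from extrapolated point
        y = x_new + beta * (x_new - x)     # momentum extrapolation
        x = x_new
        if np.linalg.norm(A @ x - b) < tol:
            return k
    return max_iter

def cg(A, b, tol=1e-8, max_iter=10_000):
    """Standard linear conjugate gradient for Ax = b."""
    x = np.zeros(len(b)); r = b.copy(); d = r.copy(); rr = r @ r
    for k in range(1, max_iter + 1):
        Ad = A @ d
        step = rr / (d @ Ad)
        x += step * d; r -= step * Ad
        rr_new = r @ r
        if np.sqrt(rr_new) < tol:
            return k
        d = r + (rr_new / rr) * d
        rr = rr_new
    return max_iter

print("sqrt(L/ell)    :", np.sqrt(L / ell))
print("AGD iterations :", nesterov(A, b, L, ell))
print("CG iterations  :", cg(A, b))
```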


Convergence Properties of Nonlinear Conjugate Gradient Methods

Recently, important contributions on convergence studies of conjugate gradient methods have been made by Gilbert and Nocedal [6]. They introduce a “sufficient descent condition” to establish global convergence results, whereas this condition is not needed in the convergence analyses of Newton and quasi-Newton methods. [6] hints that the sufficient descent condition, which was enforced by their ...
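
The “sufficient descent condition” referenced here asks each direction d_k to satisfy g_k @ d_k <= -c * ||g_k||^2 for some fixed c > 0. A hedged sketch of enforcing it by restarting in a Polak-Ribiere-type method follows (the PR+ formula, Armijo backtracking, constants, and test function are illustrative choices; the global-convergence theory usually assumes Wolfe-type line searches instead):

```python
import numpy as np

def pr_cg(f, grad, x0, c=1e-4, tol=1e-6, max_iter=2_000):
    """Polak-Ribiere (PR+) nonlinear CG with a sufficient-descent safeguard:
    restart along -g whenever g @ d > -c * ||g||^2. Illustrative sketch."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d > -c * (g @ g):           # sufficient descent violated
            d = -g                         # restart with steepest descent
        t, fx = 1.0, f(x)                  # simple Armijo backtracking
        while f(x + t * d) > fx + 1e-4 * t * (g @ d):
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))   # PR+ coefficient
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Example: the Rosenbrock function, minimized at (1, 1)
f = lambda z: (1 - z[0])**2 + 100 * (z[1] - z[0]**2)**2
grad = lambda z: np.array([-2 * (1 - z[0]) - 400 * z[0] * (z[1] - z[0]**2),
                           200 * (z[1] - z[0]**2)])
print(pr_cg(f, grad, np.array([-1.2, 1.0])))   # should approach [1. 1.]
```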


Gradient-based kernel dimension reduction for regression

This paper proposes a novel approach to linear dimension reduction for regression using nonparametric estimation with positive definite kernels or reproducing kernel Hilbert spaces. The purpose of the dimension reduction is to find such directions in the explanatory variables that explain the response sufficiently: this is called sufficient dimension reduction. The proposed method is based on a...
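
As a rough sketch of the gradient-based idea only (the Gaussian kernel, ridge parameter, scalar response, and all names are simplifying assumptions, not the paper's estimator): fit a kernel ridge regressor, evaluate its gradient at each training point, and take the top eigenvectors of the averaged gradient outer products as the estimated sufficient directions.

```python
import numpy as np

def gradient_kdr(X, y, dim, sigma=1.0, lam=1e-3):
    """Estimate a `dim`-dimensional sufficient-reduction subspace from the
    gradients of a Gaussian-kernel ridge regressor. Illustrative sketch."""
    n, p = X.shape
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-sq / (2 * sigma**2))                 # Gaussian kernel matrix
    alpha = np.linalg.solve(K + n * lam * np.eye(n), y)
    # Gradient of f(x) = sum_i alpha_i k(x_i, x) at x = x_j is
    #   sum_i alpha_i K_ij (x_i - x_j) / sigma^2.
    M = np.zeros((p, p))
    for j in range(n):
        g = (alpha * K[:, j]) @ (X - X[j]) / sigma**2
        M += np.outer(g, g) / n                      # average the outer products
    _, evecs = np.linalg.eigh(M)                     # ascending eigenvalues
    return evecs[:, ::-1][:, :dim]                   # top eigenvectors

# Toy check: y depends on X only through its first coordinate
rng = np.random.default_rng(0)
X = rng.standard_normal((300, 5))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(300)
print(np.round(gradient_kdr(X, y, dim=1).ravel(), 2))   # ~ [+-1, 0, 0, 0, 0]
```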



Journal

Journal title: Analysis and Applications

Year: 2016

ISSN: 0219-5305, 1793-6861

DOI: 10.1142/s0219530516400017